Bayesian inference in probabilistic graphical models

Author

  • Felix Leopoldo Rios

Abstract

This thesis consists of four papers studying structure learning and Bayesian inference in probabilistic graphical models, for both undirected graphs and directed acyclic graphs (DAGs).

Paper A presents a novel algorithm, called the Christmas tree algorithm (CTA), that incrementally constructs junction trees for decomposable graphs by adding one node at a time to the underlying graph. We prove that the CTA is able to generate, with positive probability, every junction tree on any given number of underlying nodes. Importantly for practical applications, we show that the transition probability of the CTA kernel has a computationally tractable expression. Applications of the CTA transition kernel are demonstrated in a sequential Monte Carlo (SMC) setting for counting the number of decomposable graphs.

Paper B presents the SMC scheme in a more general setting, specifically designed for approximating distributions over decomposable graphs. The CTA transition kernel from Paper A is incorporated as the proposal kernel. To improve on the traditional SMC algorithm, a particle Gibbs sampler with a systematic refreshment step is further proposed. A simulation study of approximate graph posterior inference within both log-linear and decomposable Gaussian graphical models shows the efficiency of the suggested methodology in both cases.

Paper C explores the particle Gibbs sampling scheme of Paper B for approximate posterior computations in the Bayesian predictive classification framework. Specifically, Bayesian model averaging (BMA) based on posterior exploration of the class-specific models is incorporated into the predictive classifier to take full account of model uncertainty. For each class, the dependence structure underlying the observed features is represented by a distribution over the space of decomposable graphs. Since an explicit expression for this posterior is intractable, averaging is performed over the approximated graph posterior. The proposed BMA classifier shows superior performance compared to the ordinary Bayesian predictive classifier, which does not account for model uncertainty, as well as to a number of out-of-the-box classifiers.

Paper D develops a novel prior distribution over DAGs with the ability to express prior knowledge in terms of graph layerings. In conjunction with the prior, a stochastic optimization algorithm based on the layering property of DAGs is developed for performing structure learning in Bayesian networks. A simulation study shows that the algorithm, together with the prior, has superior performance compared with existing priors when used for learning graphs with a clearly layered structure.
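To make the Paper C construction concrete, the following is a minimal, hypothetical sketch (not the thesis code) of Bayesian model averaging in a predictive classifier: each class keeps a handful of models that stand in for draws from its approximate graph posterior, and the class-conditional predictive density is the Monte Carlo average over those draws. The toy Gaussian "models", names, and numbers below are illustrative assumptions only.

```python
# Hypothetical sketch of BMA predictive classification (not the thesis code):
# each class keeps samples standing in for draws from its approximate graph
# posterior, and the predictive density is their Monte Carlo average.
import numpy as np
from scipy.stats import multivariate_normal


def bma_predict(x, class_models, class_priors):
    """Return the class maximising prior * averaged predictive density.

    class_models[c] is a list of (mean, cov) pairs standing in for models
    drawn from the class-c posterior; averaging over them approximates
    integrating the predictive density over model uncertainty.
    """
    scores = {}
    for c, models in class_models.items():
        densities = [multivariate_normal.pdf(x, mean=m, cov=S) for m, S in models]
        scores[c] = class_priors[c] * np.mean(densities)
    return max(scores, key=scores.get)


# Toy usage: two classes, three posterior draws each (all values made up).
class_models = {
    0: [(np.zeros(2), np.array([[1.0, r], [r, 1.0]])) for r in (0.0, 0.2, 0.4)],
    1: [(np.full(2, 2.0), np.eye(2)) for _ in range(3)],
}
class_priors = {0: 0.5, 1: 0.5}
print(bma_predict(np.array([1.8, 2.1]), class_models, class_priors))  # -> 1
```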


Similar resources

Rule-based joint fuzzy and probabilistic networks

One of the important challenges in graphical models is dealing with the uncertainties present in a problem. Among graphical networks, the fuzzy cognitive map can model only fuzzy uncertainty, while the Bayesian network can model only probabilistic uncertainty. In many real problems, we are faced with both fuzzy and probabilistic uncertainties. In these cases, the propo...

An Introduction to Inference and Learning in Bayesian Networks

Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in different subjects such as disease diagnosis, weather forecasting, decision making and clustering. A BN is a graphical-probabilistic model which represents causal relations among random variables and consists of a directed acyclic graph and a set of conditional probabilities. Structure...
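As a minimal illustration of that definition (a hypothetical two-node example, not taken from the survey above), the sketch below encodes a Bayesian network as a DAG plus one conditional probability table (CPT) per node and evaluates the joint distribution by the chain rule.

```python
# Hypothetical two-node example (not from the survey): a Bayesian network is
# a DAG plus one conditional probability table (CPT) per node, and the joint
# distribution factorises as the product of P(node | parents).
dag = {"Rain": [], "WetGrass": ["Rain"]}             # node -> parents
cpt = {
    "Rain": {(): {True: 0.2, False: 0.8}},           # P(Rain)
    "WetGrass": {                                    # P(WetGrass | Rain)
        (True,): {True: 0.9, False: 0.1},
        (False,): {True: 0.1, False: 0.9},
    },
}


def joint(assignment):
    """P(assignment) via the chain rule: product of P(node | its parents)."""
    p = 1.0
    for node, parents in dag.items():
        parent_values = tuple(assignment[q] for q in parents)
        p *= cpt[node][parent_values][assignment[node]]
    return p


print(joint({"Rain": True, "WetGrass": True}))  # 0.2 * 0.9 = 0.18
```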

Introduction to Probabilistic Graphical Models

Over the last decades, probabilistic graphical models have become the method of choice for representing uncertainty in machine learning. They are used in many research areas such as computer vision, speech processing, time-series and sequential data modelling, cognitive science, bioinformatics, probabilistic robotics, signal processing, communications and error-correcting coding theory, and in ...

A Survey on Learning Bayesian Networks and Probabilistic Models of Cognition

In this paper, we present a collection of studies on learning Bayesian networks and on one of their applications, probabilistic models of human cognition. A Bayesian network is an encoding of probabilistic relationships among random variables via a graphical model. Probabilistic models of cognition aim to explain human cognition based on the principles of probability theory and statistics. Prob...

Probabilistic inference in graphical models

A "graphical model" is a type of probabilistic network that has roots in several different research communities. The graphical model framework provides a clean mathematical formalism that has made it possible to understand the relationships among a wide variety of network-based approaches to computation, and in particular to understand many neural network al...

Accelerating Inference: towards a full Language, Compiler and Hardware stack

We introduce Dimple, a fully open-source API for probabilistic modeling. Dimple allows the user to specify probabilistic models in the form of graphical models, Bayesian networks, or factor graphs, and performs inference (by automatically deriving an inference engine from a variety of algorithms) on the model. Dimple also serves as a compiler for GP5, a hardware accelerator for inference.


Publication date: 2017